1.
Chinese Medical Journal; (24): 821-828, 2021.
Article in English | WPRIM | ID: wpr-878109

ABSTRACT

BACKGROUND: Colorectal cancer threatens patients' lives, and treatment is determined by accurate preoperative staging. Magnetic resonance imaging (MRI) plays an important role in the preoperative examination of patients with rectal cancer, and artificial intelligence (AI) applied to image learning has made significant advances in recent years. By introducing AI into MRI interpretation, a stable platform for image recognition and judgment can be established in a short period. This study aimed to establish an automatic diagnostic platform for predicting the preoperative T stage of rectal cancer with a deep neural network.

METHODS: Data from 183 rectal cancer patients were collected retrospectively. A faster region-based convolutional neural network (Faster R-CNN) was used to build the platform, which was evaluated with receiver operating characteristic (ROC) curves.

RESULTS: An automatic diagnostic platform for T staging of rectal cancer was established from the MRI studies. The areas under the ROC curve (AUC) were 0.99 in the horizontal plane, 0.97 in the sagittal plane, and 0.98 in the coronal plane. In the horizontal plane, the AUCs for the T1, T2, T3, and T4 stages were all 1.00. In the coronal plane, the AUCs for the T1, T2, T3, and T4 stages were 0.96, 0.97, 0.97, and 0.97, respectively. In the sagittal plane, the AUCs for the T1, T2, T3, and T4 stages were 0.95, 0.99, 0.96, and 1.00, respectively.

CONCLUSION: Faster R-CNN-based AI may be an effective and objective method for building a platform to predict rectal cancer T stage.

TRIAL REGISTRATION: chictr.org.cn: ChiCTR1900023575; http://www.chictr.org.cn/showproj.aspx?proj=39665.


Subject(s)
Humans; Artificial Intelligence; Magnetic Resonance Imaging; Neoplasm Staging; Neural Networks, Computer; Rectal Neoplasms/pathology; Retrospective Studies
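The per-stage AUC values reported above are one-vs-rest discrimination scores: each measures how well the model's confidence for one T stage separates that stage from the rest. A minimal sketch of how such an AUC can be computed from classifier scores (the toy labels and scores below are illustrative assumptions, not data from the study):

```python
def roc_auc(labels, scores):
    """Area under the ROC curve via pairwise comparison:
    the probability that a randomly chosen positive is scored
    above a randomly chosen negative (ties count as half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy one-vs-rest example: label 1 = "lesion is stage T3", 0 = any other stage.
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.6, 0.3, 0.1]  # hypothetical model confidence for T3
print(round(roc_auc(labels, scores), 3))  # → 0.889
```

An AUC of 1.00, as reported for all four stages in the horizontal plane, means every positive case was scored above every negative case.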
2.
Chinese Medical Journal; (24): 2804-2811, 2019.
Article in English | WPRIM | ID: wpr-781740

ABSTRACT

BACKGROUND: Artificial intelligence-assisted image recognition can detect the target area of an image and extract information to classify it according to target features. This study aimed to use deep neural networks for computed tomography (CT) diagnosis of perigastric metastatic lymph nodes (PGMLNs), simulating radiologists' recognition of lymph nodes, in order to obtain more accurate identification results.

METHODS: A total of 1371 images of suspected lymph node metastases from enhanced abdominal CT scans were identified and labeled by radiologists and, together with 18,780 original images, were used for faster region-based convolutional neural network (FR-CNN) deep learning. The FR-CNN's identification of 6000 random CT images from 100 gastric cancer patients was compared with radiologists' results in terms of identification accuracy. Similarly, 1004 CT images with metastatic lymph nodes confirmed postoperatively by pathological examination, together with 11,340 original images, were used in the same identification and learning process. The same 6000 gastric cancer CT images were used for verification, and the diagnostic results were analyzed.

RESULTS: In the initial group, precision-recall curves were generated from the precision and recall rates for the nodule classes of the training and validation sets; the mean average precision (mAP) was 0.5019. To verify the results of the initial learning group, a receiver operating characteristic curve was generated, and the corresponding area under the curve (AUC) was 0.8995. After the second phase of precise learning, all indicators improved: the mAP and AUC values were 0.7801 and 0.9541, respectively.

CONCLUSION: Through deep learning, FR-CNN achieved high judgment effectiveness and recognition accuracy in the CT diagnosis of PGMLNs.

TRIAL REGISTRATION: Chinese Clinical Trial Registry, No. ChiCTR1800016787; http://www.chictr.org.cn/showproj.aspx?proj=28515.
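The mAP values reported above summarize how well the detector's predicted bounding boxes match the radiologist-labeled lymph nodes. A core step in that evaluation is the intersection-over-union (IoU) overlap test between a predicted box and a ground-truth box. A minimal sketch (the box coordinates and the 0.5 threshold below are common illustrative assumptions, not values from the study):

```python
def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# A predicted node box typically counts as a true positive when its IoU
# with a labeled box exceeds a threshold (0.5 is a common default).
pred, truth = (10, 10, 50, 50), (15, 15, 55, 55)
print(iou(pred, truth) > 0.5)  # → True
```

Average precision is then computed per class from the precision-recall curve built on these matches, and mAP is the mean over classes.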
